Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives

Authors

Abstract

This paper introduces Ranking Info Noise Contrastive Estimation (RINCE), a new member in the family of InfoNCE losses that preserves a ranked ordering of positive samples. In contrast to the standard InfoNCE loss, which requires a strict binary separation of the training pairs into similar and dissimilar samples, RINCE can exploit information about a similarity ranking for learning a corresponding embedding space. We show that the proposed loss function learns favorable embeddings compared to the standard InfoNCE whenever at least noisy ranking information can be obtained or when the definition of positives and negatives is blurry. We demonstrate this on a supervised classification task with additional superclass labels and noisy similarity scores. Furthermore, we show that RINCE can also be applied to unsupervised learning, with experiments on representation learning from videos. In particular, the learned embedding yields higher classification accuracy and retrieval rates, and performs better on out-of-distribution detection than the standard InfoNCE loss.
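The abstract describes RINCE as an InfoNCE variant in which positives carry a similarity ranking rather than a single binary label. The sketch below illustrates one way such a ranked objective can be written down: each rank contributes an InfoNCE-style term in which less-similar positives act as negatives, and its positives are then removed from the candidate pool before the next rank's term. The masking scheme, the per-rank temperatures, and all names (`rince_loss`, `rank_masks`, `taus`) are our assumptions for illustration, not the paper's reference implementation.

```python
# Minimal PyTorch sketch of a ranked InfoNCE objective in the spirit of
# RINCE, written from the abstract's description. Names and the exact
# masking/temperature scheme are assumptions, not the paper's code.
import torch

def rince_loss(sim, rank_masks, neg_mask, taus):
    """sim: (B, K) similarities between B queries and K candidates.
    rank_masks: list of (B, K) bool masks; rank_masks[i] marks the
        positives of rank i, with rank 0 the most similar.
    neg_mask: (B, K) bool mask marking the true negatives.
    taus: one temperature per rank.
    """
    # Pool of candidates still in play: all positives plus all negatives.
    pool = neg_mask.clone()
    for m in rank_masks:
        pool |= m

    loss = sim.new_zeros(())
    for pos_mask, tau in zip(rank_masks, taus):
        logits = sim / tau
        logits = logits - logits.max(dim=1, keepdim=True).values  # stability
        exp = torch.exp(logits) * pool.float()
        num = (exp * pos_mask.float()).sum(dim=1)   # rank-i positives
        den = exp.sum(dim=1)                        # pool incl. lower ranks
        loss = loss - torch.log(num / den + 1e-12).mean()
        # Rank-i positives leave the pool: less-similar positives served as
        # negatives in this term and become the positives of the next one.
        pool = pool & ~pos_mask
    return loss
```

With a single rank, the pool never shrinks and the expression reduces to a standard InfoNCE term with the positives summed in the numerator, consistent with the abstract's framing of RINCE as a member of the InfoNCE family.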


Related papers

Learning Probabilistic Submodular Diversity Models Via Noise Contrastive Estimation

Modeling diversity of sets of items is important in many applications such as product recommendation and data summarization. Probabilistic submodular models, a family of models including the determinantal point process, form a natural class of distributions, encouraging effects such as diversity, repulsion and coverage. Current models, however, are limited to small and medium number of items du...


Learning Contrastive Connectives in Sentence Realization Ranking

We look at the average frequency of contrastive connectives in the SPaRKy Restaurant Corpus with respect to realization ratings by human judges. We implement a discriminative n-gram ranker to model these ratings and analyze the resulting n-gram weights to determine if our ranker learns this distribution. Surprisingly, our ranker learns to avoid contrastive connectives. We look at possible expla...


Collective Noise Contrastive Estimation for Policy Transfer Learning

We address the problem of learning behaviour policies to optimise online metrics from heterogeneous usage data. While online metrics, e.g., click-through rate, can be optimised effectively using exploration data, such data is costly to collect in practice, as it temporarily degrades the user experience. Leveraging related data sources to improve online performance would be extremely valuable, b...


Learning word embeddings efficiently with noise-contrastive estimation

Continuous-valued word embeddings learned by neural language models have recently been shown to capture semantic and syntactic information about words very well, setting performance records on several word similarity tasks. The best results are obtained by learning high-dimensional embeddings from very large quantities of data, which makes scalability of the training method a critical factor. W...


Improving Language Modelling with Noise-contrastive estimation

Neural language models do not scale well when the vocabulary is large. Noise contrastive estimation (NCE) is a sampling-based method that allows for fast learning with large vocabularies. Although NCE has shown promising performance in neural machine translation, its full potential has not been demonstrated in the language modelling literature. A sufficient investigation of the hyperparameters ...

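The related entries above all build on the same binary noise-contrastive estimation idea: fit an unnormalized model by training a logistic classifier to tell observed data from samples drawn from a known noise distribution q. The following is a minimal sketch of that objective as popularized for word embeddings and language modelling; the function and argument names are illustrative, and the model score s(w, c) is assumed to approximate the unnormalized log-probability.

```python
# Hedged sketch of binary noise-contrastive estimation: classify each
# observed word against k noise words sampled from a distribution q.
# All names are illustrative, not tied to any specific paper's code.
import math
import torch
import torch.nn.functional as F

def nce_loss(data_scores, noise_scores, log_q_data, log_q_noise, k):
    """data_scores: (B,) model scores s(w, c) of the observed words.
    noise_scores: (B, k) scores of the k noise words per example.
    log_q_data / log_q_noise: log q(.) of those words under the noise
        distribution, shapes (B,) and (B, k).
    """
    # Posterior P(data | w, c) has logit s(w, c) - log(k * q(w)).
    data_logits = data_scores - (math.log(k) + log_q_data)
    noise_logits = noise_scores - (math.log(k) + log_q_noise)
    data_term = F.binary_cross_entropy_with_logits(
        data_logits, torch.ones_like(data_logits))
    noise_term = F.binary_cross_entropy_with_logits(
        noise_logits, torch.zeros_like(noise_logits))
    # Sum over the k noise samples, average over the batch.
    return data_term + k * noise_term
```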


Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2022

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v36i1.19972